Growing demand on dark web for AI abuse images
New study found a clear desire among online offenders to learn new technologies
There is clear evidence of a growing demand for AI-generated images of child sexual abuse on the dark web, according to a new research report published by Anglia Ruskin University’s International Policing and Public Protection Research Institute (IPPPRI).
The innovative study of the dark web seeks to understand how online offenders are using artificial intelligence (AI) to create child sexual abuse material (CSAM).
The ‘IPPPRI Insights’ publication comes after the Internet Watch Foundation shared a report highlighting the continued growth of this emerging technology as a tool to exploit children.
Researchers Dr Deanna Davy and Professor Sam Lundrigan analysed chats that had taken place in dark web forums over the past 12 months. They found clear evidence of growing interest in the technology, its continued use by offenders, and a collective desire among online offenders for others to learn how to create new abuse imagery.
Dr Davy explained:
The research, part of a wider programme of work into technology-facilitated child sexual abuse funded by the Dawes Trust, found that forum members are actively teaching themselves how to create AI-generated CSAM by accessing guides and videos online, and by sharing advice and guidance among themselves.
Analysis also showed that forum members are using non-AI-generated images and videos already at their disposal to facilitate their learning, and that many shared their hopes and expectations that the technology would evolve, making it even easier for them to create this material. Some forum members also referred to those creating the AI imagery as “artists”.
Dr Davy concluded:
“There is a misconception that AI-generated images are ‘victimless’, and this could not be further from the truth. We found that many of the offenders are sourcing images of children in order to manipulate them, and that the desire for ‘hardcore’ imagery, escalating from ‘softcore’, is regularly discussed.”
Professor Lundrigan added:
The full report can be viewed here.